Thermodynamics looks for statistical patterns in groups of particles. A group of particles is called a system. Systems have properties that single particles don't, like temperature, pressure, volume, density, and entropy.
Unlike other physics topics we've learned, thermodynamics is statistical. Its predictions only have a probability of being true; they are never 100% certain. But the accuracy increases with more particles, and most systems contain a very large number of particles.
strategy
Calculate the volume in cubic meters (width × height × depth).
Convert the cubic meters into moles with a conversion fraction.
Convert the moles into particles with another conversion fraction.
mol conversions
1 mol = 0.0224 m³ of gas at STP
1 mol = 6.02 × 10²³ particles
0.0224 m³ of gas at STP = 6.02 × 10²³ particles
solution
$$\text{volume} = 10\,\mathrm{m} \times 10\,\mathrm{m} \times 4\,\mathrm{m}= 400 \, \mathrm{m^3}$$ $$ 400 \, \mathrm{m^3} \left( \frac{1 \, \mathrm{mol}}{0.0224 \, \mathrm{m^3}} \right) = 18\,000 \, \mathrm{mol}$$ $$ 18\,000 \, \mathrm{mol} \left( \frac{ 6.02 \times 10^{23}}{1 \, \mathrm{mol}} \right) = 1.08 \times 10^{28}$$ $$\small 10\,000\,000\,000\,000\,000\,000\,000\,000\,000\,\,\, \normalsize\text{gas particles}$$
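As a quick check on the arithmetic, here is a small Python sketch of the same conversion chain, using the room size and STP conversions from above. Small differences from the worked answer come from rounding.

```python
# Convert a room's volume into a count of gas particles using the STP conversions above.
AVOGADRO = 6.02e23          # particles per mole
MOLAR_VOLUME_STP = 0.0224   # cubic meters of gas per mole at STP

volume = 10 * 10 * 4                 # m^3, the room in the example
moles = volume / MOLAR_VOLUME_STP    # about 18 000 mol
particles = moles * AVOGADRO         # about 1.08 x 10^28 particles

print(f"{moles:.3g} mol, {particles:.3g} particles")
```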
Temperature
Temperature is a statistical property of a system of particles. We can measure temperature by looking at the expansion of fluids. At high temperatures fluids take up more space. This increases the level of the alcohol in a thermometer.
Temperature is often defined as proportional to the average kinetic energy of a system of particles. The kinetic energies that contribute to temperature can be stored in a particle's spin, vibrations, and motion.
temperature ∝ average kinetic energy of the particles in a system
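For a monatomic ideal gas this proportionality becomes an exact relation from kinetic theory (stated here without derivation): the average translational kinetic energy of a particle is
$$\left\langle \tfrac{1}{2}mv^2 \right\rangle = \tfrac{3}{2} k_B T, \qquad k_B = 1.38 \times 10^{-23} \, \mathrm{\tfrac{J}{K}}$$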
Heat
Heat is energy transferred from one system to another. The transfer is often of thermal energy, and it can happen through conduction, convection, or radiation. A system doesn't "have" heat; it has internal energy. Energy only counts as heat while it is transferring between systems.
An example of heat exiting a system is friction. If you throw a foam ball, heat will leave the ball and enter the air. The ball's kinetic energy converts into the thermal energy of the air.
The concept of heat plays a role in all machines like computers, engines, and refrigerators.
Question: Metal and wood at room temperature are at the same temperature, yet the metal feels colder to the touch. Why?
answer
Metal is a conductor of both electricity and heat. This means that if you are hotter than the metal, heat will flow out of your body into the metal at a very fast rate.
Wood is a poor conductor of heat so it doesn't pull as much heat from your body.
Energy is conserved. This means that if we add up the total energy of a system it will always be the same value, unless some energy enters or leaves the system we are tracking.
Turn friction on and observe what happens.
You should notice that the kinetic energy leaves the ball. We call energy that leaves a system heat. Heat energy can take many forms, but generally it's thermal energy, the kinetic energy of many microscopic particles.
The total energy of an isolated system is conserved, but it's not when energy leaves or enters that system. The first law of thermodynamics says that the change in internal energy of a system is equal to the total heat and work entering and exiting the system. With the sign conventions defined below, $$\Delta E = W - Q$$ We can make it look more like our familiar conservation of energy equation by unpacking the ΔE into final and initial energy.
$$E_i = E_f + Q - W$$
\( E \) = total internal energy of a system [J]
\( Q \) = energy exiting the system as heat [J]
heat entering the system would be negative
\( W \) = work done on the system [J]
work done by the system would be negative
In these equations, heat mostly represents thermal energy, and work mostly represents mechanical energies like kinetic, gravitational, and elastic energy.
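The sign conventions are easy to mix up, so here is a tiny Python sketch of the bookkeeping; the function name and numbers are just for illustration.

```python
# First law with this section's sign convention:
# Q is heat exiting the system, W is work done on the system.
def change_in_internal_energy(heat_out, work_on):
    return work_on - heat_out   # rearranged from E_i = E_f + Q - W

# A system loses 30 J as heat while 50 J of work is done on it,
# so its internal energy rises by 20 J.
print(change_in_internal_energy(heat_out=30, work_on=50))   # 20
```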
In this example, use the conservation of energy equation, but add a Q term on the final side to represent heat exiting the system through friction.
$$K_i + U_i = K_f + U_f + Q$$
Example: A 0.43 kg soccer ball, kicked at 10 m/s, rolls down a 30 m tall hill. If there was no energy loss, the ball would have a final velocity of 26.2 m/s. When performing the experiment we found the ball only has a speed of 20.0 m/s at the bottom. Calculate the heat that left the ball as it rolled down the hill.
solution
$$K_i + U_i = K_f + Q$$ $$\tfrac{1}{2}mu^2 + mgh_i = \tfrac{1}{2}mv^2 + Q$$ $$\tfrac{1}{2}(0.43)(10.0)^2 + (0.43)(9.8)(30) = \tfrac{1}{2}(0.43)(20)^2 + Q$$ $$61.9 \,\mathrm{J} = Q$$
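Here is the same calculation as a short Python sketch, with the numbers taken from the example above.

```python
# Heat lost by the soccer ball, from K_i + U_i = K_f + Q (U_f = 0 at the bottom).
m, u, v, h, g = 0.43, 10.0, 20.0, 30.0, 9.8

energy_at_top = 0.5 * m * u**2 + m * g * h   # kinetic + gravitational potential
kinetic_at_bottom = 0.5 * m * v**2
heat_out = energy_at_top - kinetic_at_bottom

print(round(heat_out, 1))   # 61.9 J
```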
Example: A 2000 kg car traveling at 30 m/s slams on the brakes and skids to a stop. How much energy left the car as heat?
solution
$$K_i + U_i = K_f + U_f + Q$$ $$K = Q$$ $$\tfrac{1}{2}mv^2 = Q$$ $$\tfrac{1}{2}(2000)(30)^2 = Q$$ $$900\,000 \, \mathrm{J} = Q$$
Example: If the coefficient of kinetic friction is 0.7, how far did the car skid?
hint
Instead of energy leaving as heat, think of the car slowing down because the car did work against the friction force. $$E_i = E_f + Q - W$$ $$K_i + U_i = K_f + U_f + Q - W$$ $$K = W$$
In case you forgot, here are the work and friction equations. Also, remember that for a flat surface the normal force will equal the force of gravity.
$$ W=F\Delta x \quad \quad F_k=F_N\mu_k \quad \quad F_N=F_g=mg$$
solution
$$E_i = E_f + Q - W$$ $$K_i + U_i = K_f + U_f - W$$ $$K = W$$ $$W=F\Delta x$$ $$W=F_k\Delta x$$ $$W=F_N\mu_k\Delta x$$ $$W=mg\mu_k\Delta x$$ $$\Delta x = \frac{W}{mg\mu_k}$$ $$\Delta x = \frac{900000}{(2000)(9.8)(0.7)}$$ $$\Delta x = 65.6\,\mathrm{m}$$
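The same chain of equations as a short Python sketch, using the example's numbers:

```python
# Skid distance from the work done against friction on a flat road (F_N = mg).
m, v, mu, g = 2000, 30, 0.7, 9.8

work_needed = 0.5 * m * v**2        # the 900 000 J of kinetic energy from before
friction_force = mu * m * g         # F_k = mu * F_N
skid_distance = work_needed / friction_force

print(round(skid_distance, 1))      # 65.6 m
```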
Example: A 1000 kg cart starts at rest at point A and rolls along the track to point C. Along the way, 57 000 J of energy leaves the cart as heat. Use the heights in the figure to find the cart's speed at point C. (Each grid square is 10 m × 10 m.)
solution
$$E_{\mathrm{point \, A}} = E_{\mathrm{point \, C}}$$ $$ U_g = K + U_g + Q $$ $$ mgh = \tfrac{1}{2}mv^2 + mgh + Q$$ $$ (1000)(9.8)(100) = \tfrac{1}{2}(1000)v^2 + (1000)(9.8)(47) + 57000 $$ $$ 980000 = 500v^2 + 460600 + 57000 $$ $$ 462400 = 500v^2$$ $$ 30.4\, \mathrm{\tfrac{m}{s}} = v$$
Entropy
Entropy is hard to define because different fields of study use the term to describe slightly different ideas. The definitions of entropy are all loosely centered around disorder. A high entropy system of particles is not organized. A low entropy system is orderly, but how do you measure order?
A common definition of order is a concentration of energy. This means a concentration of energy has lower entropy than an evenly spread out distribution.
macrostates vs. microstates
In statistical mechanics, entropy is defined in terms of information: it is calculated by counting the number of microstates that could produce a macrostate. A macrostate is a collection of measurable properties for a system of particles, like its temperature, pressure, or volume. A microstate is a possible configuration of particles that could produce those measurable properties.
The macrostate of a disorderly system could arise from many possible microstates. In other words there are many ways things could be arranged and still be disorderly. On the other hand the macrostate of an orderly system has fewer possible microstates.
For example, the macrostate of some water molecules could be "20 °C and a pressure of 1 atmosphere". That one macrostate could originate from many different possible microstates. Each microstate includes every molecule's position, velocity, and many other attributes, but in the macrostate these attributes are ignored in order to focus on what's easy to measure: temperature and pressure. A system with many possible microstates for a given macrostate is high entropy. If only a few microstates could produce a macrostate, the entropy is low.
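This counting idea is usually summarized by Boltzmann's entropy formula, where \( \Omega \) is the number of microstates consistent with a macrostate and \( k_B \) is Boltzmann's constant:
$$S = k_B \ln \Omega$$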
Not all energy can make an engine do work. Just making an engine hot adds energy to the engine, but the energy needs to be concentrated in the right place. Engines need energy that is in a specific, orderly state. Entropy lets us measure usable energy and build a model of how engines do work.
The simulation below shows a simple engine that is designed to turn a rotor. What state of the system would make the rotor turn?
Simulation: First, click to remove the particles. Next, click inside the simulation a few times to add some particles. Try to place the particles in a state that will turn the rotor.
results
The rotor turns when you release a high concentration of particles on one side. Concentrated particles are a lower entropy state. As the engine does the work of rotating, the particles spread out and entropy increases.
Once the particles are evenly spread out, the rotor will stop turning. If your system gets in this high entropy state, adjust the slider on the right to power the rotor from an external power source. The powered rotor lowers entropy by reconcentrating the particles.
We can view a car engine as a thermodynamic system. An engine takes in concentrated, low entropy energy in the form of gasoline. An energy difference is produced when the gasoline goes through combustion. That difference in energy concentration is then transformed into rotational energy, which makes the car move.
Energy is only useful when it is in a lower entropy state than its surroundings.
Question: You can think about a human body as a thermodynamic system. As we do work, the entropy in our body increases. What keeps our entropy low?
answer
We add low entropy substances to our system, like food, oxygen, and water.
We also lower our entropy by removing high entropy substances, like carbon dioxide, urine, and feces.
Question: The planet Earth and all its life can be viewed as a thermodynamic system. As time goes forward, the energy density of Earth spreads out and loses the ability to do work. What keeps the entropy of the Earth low?
answer
Light from the Sun is added to our system. Sunlight lowers our entropy on the global scale by creating differences in temperature. On the atomic scale photons of sunlight reduce entropy through photosynthesis.
The Arrow of Time
There is no arrow of space. Space is generally uniform in every direction. Yet, time does seem to have an arrow. The past is clearly different from the future. So we say there is an arrow of time. Time has a direction.
If you watched a video of a glass of water falling off a table, you would know if the video was in reverse. It would be strange to see broken glass and spilled water spontaneously come together and jump up onto a table.
Yet, sometimes the direction of time isn't clear. A video of the Earth orbiting the Sun looks similar forwards and backwards. It has no clear arrow of time.
Question: Imagine each of these situations happening in reverse. Which ones would look obviously backwards?
answer
Situations with a clear arrow of time are in red.
Question: What do the situations with a clear arrow of time have in common?
answer
The situations with a clear arrow of time are complex with many moving parts. These might include systems with friction, or with living creatures.
The systems with no clear arrow of time are all simple with few moving parts.
The laws of physics are time reversible. Nature behaves the same moving forwards or backwards in time. The past doesn't produce the future. A cause and effect could be equally thought of as an effect producing a cause.
This is clear for systems with just a few parts. Low complexity systems don't have enough possible states to progress in a direction. Yet, for systems with many parts a direction of time seems to emerge. This is because as complex systems change they tend to get more chaotic.
The second law of thermodynamics says that the entropy of an isolated system increases over time. This holds for all the various definitions of entropy. As time moves forward, a system will almost always stay the same or get messier.
Why is the second law of thermodynamics true? It's because there are many ways for the energy of a system to be evenly distributed, but few ways for it to be concentrated. As a system progresses forward in time, there is a nonzero probability of it evolving into a low entropy state, but that probability is very small. Most of the time the system will increase in entropy.
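A small Python sketch can make the counting argument concrete. It uses the same left-or-right setup as the investigation near the end of this section: the fully concentrated macrostate has only one microstate, while an even split has a huge number of them.

```python
# Count microstates for n particles that can each be on the left or right half of a box.
from math import comb

n = 20
total_states = 2**n                 # every possible left/right arrangement
all_on_left = comb(n, 0)            # 1 way to be fully concentrated on one side
even_split = comb(n, n // 2)        # 184 756 ways to be split 10/10

print(all_on_left / total_states)   # ~0.000001, the chance of the concentrated macrostate
print(even_split / total_states)    # ~0.18, the chance of a perfectly even macrostate
```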
The entropy of a complex system almost always increases. Our universe began at its lowest entropy. The far future is the highest entropy.
Diffusion is a good example of the second law of thermodynamics. The simulation below will become more evenly mixed as time progresses forward. It would be extremely unlikely to see the reverse where the system separates as time progresses forward. This means there is a clear difference between directions in time.
The simulation below has 150 particles on each side. As time progresses forward, the particles quickly mix together. Yet, the odds of the 300 particles unmixing are so low that it would take longer than the age of the universe.
Investigation
Let's say that each particle can be in only 2 states: left or right.
We can count the total number of ways to put each particle into one of 2 states.
| number of particles | total possible states |
| --- | --- |
| \(0\) | \(1\) |
| \(1\) | \(2 = 2\) |
| \(2\) | \(2 \times 2 = 4\) |
| \(3\) | \(2 \times 2 \times 2 = 8\) |
| \(4\) | \(2 \times 2 \times 2 \times 2 = 16\) |
| \(5\) | \(2 \times 2 \times 2 \times 2 \times 2 = 32\) |
| \(\text{\color{red}n}\) | \(2^{\color{red}n}\) |
| \(20\) | \(2^{20} = 1\,048\,576\) |
Trying random configurations, it would take on average half of the total number of states to reach our unique unmixed state.
$$\frac{2^{n}}{2}$$
I wrote a program that counts how often a particle in the simulation switches states at the fastest time rate. For ten particles on each side there are about 18.75 switches per second, or one every 1/18.75 seconds.
$$T_{avg} \approx \left(\frac{2^{20}}{2}\right) \left(\frac{1}{18.75}\right) \mathrm{s}$$ $$T_{avg} \approx 27\,960 \, \mathrm{s} = 7.767 \, \mathrm{hr}$$
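Here is that estimate as a small Python sketch. The switching rate is the one measured above for ten particles per side; for other particle counts the rate would be different.

```python
# Average time to stumble onto the fully separated state by random switching,
# assuming the system must sample half of the 2^n possible states on average.
def expected_separation_time(n, switches_per_second=18.75):
    states_to_try = 2**n / 2
    return states_to_try / switches_per_second   # seconds

print(expected_separation_time(20) / 3600)               # ~7.8 hours, matching the estimate above
print(expected_separation_time(40) / (3600 * 24 * 365))  # ~930 years for n = 40
```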
The graphs below assume that each particle switches states once a second. The time to separate increases quickly as the number of particles goes up. For up to about n = 14 particles, the estimated time to separate takes seconds. Waiting for n = 35 particles to spontaneously separate takes years.
How long would it take n = 300 particles to randomly separate?
How long would it take all the air molecules in a typical room?
(n = 10 000 000 000 000 000 000 000 000 000 air molecules)
Thinking about time is exciting, but confusing. Let's review.